46 research outputs found

    A Case Study on Computational Hermeneutics: E. J. Lowe’s Modal Ontological Argument

    Get PDF
    Computers may help us to better understand (not just verify) arguments. In this article we defend this claim by showcasing the application of a new, computer-assisted interpretive method to an exemplary natural-language argument with strong ties to metaphysics and religion: E. J. Lowe’s modern variant of St. Anselm’s ontological argument for the existence of God. Our new method, which we call computational hermeneutics, has been particularly conceived for use in interactive-automated proof assistants. It aims at shedding light on the meanings of words and sentences by framing their inferential role in a given argument. By employing automated theorem proving technology within interactive proof assistants, we are able to drastically reduce (by several orders of magnitude) the time needed to test the logical validity of an argument’s formalization. As a result, a new approach to logical analysis, inspired by Donald Davidson’s account of radical interpretation, has been enabled. In computational hermeneutics, the utilization of automated reasoning tools effectively boosts our capacity to expose the assumptions we indirectly commit ourselves to every time we engage in rational argumentation, and it fosters the explicitation and revision of our concepts and commitments.
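    As a rough illustration of the kind of check described above, the following Isabelle/HOL fragment is a minimal sketch with invented toy premises (not Lowe’s actual ones): once an argument has been formalized, tools such as sledgehammer can search for a proof of the conclusion from the premises, while nitpick can search for a countermodel if the inference is suspected to be invalid.

```isabelle
theory Toy_Argument_Check
  imports Main
begin

(* Toy stand-ins for a candidate formalization of an argument; the predicates
   "Necessary" and "Concrete" are illustrative placeholders only. *)
typedecl e                                   (* a domain of entities *)
consts Necessary :: "e \<Rightarrow> bool"
consts Concrete  :: "e \<Rightarrow> bool"

(* Premises of the toy argument. *)
axiomatization where
  P1: "\<exists>x. Necessary x" and
  P2: "\<forall>x. Necessary x \<longrightarrow> \<not> Concrete x"

(* Testing logical validity is mechanical: blast finds a proof immediately here;
   sledgehammer/nitpick are invoked analogously on the far harder formalizations
   discussed in the article. *)
lemma Conclusion: "\<exists>x. \<not> Concrete x"
  using P1 P2 by blast

end
```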

    Semantical Investigations on Non-classical Logics with Recovery Operators: Negation

    Full text link
    We investigate mathematical structures that provide a natural semantics for families of (quantified) non-classical logics featuring special unary connectives, called recovery operators, that allow us to 'recover' the properties of classical logic in a controlled fashion. These structures are called topological Boolean algebras. They are Boolean algebras extended with additional unary operations, called operators, which satisfy particular conditions of a topological nature. In the present work we focus on the paradigmatic case of negation. We show how these algebras are well-suited to provide a semantics for some families of paraconsistent Logics of Formal Inconsistency and paracomplete Logics of Formal Undeterminedness, which feature recovery operators used to earmark propositions that behave 'classically' in interaction with non-classical negations. In contrast to traditional semantical investigations, carried out in natural language (extended with mathematical shorthand), our formal meta-language is a system of higher-order logic (HOL) for which automated reasoning tools exist. In our approach, topological Boolean algebras become encoded as algebras of sets via their Stone-type representation. We employ our higher-order meta-logic to define and interrelate several transformations on unary set operations (operators), which naturally give rise to a topological cube of opposition. Furthermore, our approach allows for a uniform characterization of propositional, first-order and higher-order quantification (also restricted to constant and varying domains). With this work we want to make a case for the utilization of automated theorem proving technology for doing computer-supported research in non-classical logics. All presented results have been formally verified (and in many cases obtained) using the Isabelle/HOL proof assistant.
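    To convey the flavour of the encoding style sketched in the abstract, the following Isabelle/HOL fragment is a purely illustrative sketch: propositions are modelled as sets of points, an interior-like operator is left abstract, and the definitions of the paraconsistent negation and the recovery (consistency) operator below are ours, not the paper's.

```isabelle
theory Set_Operator_Sketch
  imports Main
begin

(* An abstract interior-like operator on sets of points; the topological
   conditions imposed on it in the paper are not reproduced here. *)
consts intr :: "'w set \<Rightarrow> 'w set"

(* An illustrative paraconsistent negation: the complement of the interior. *)
definition pneg :: "'w set \<Rightarrow> 'w set" where
  "pneg A \<equiv> - (intr A)"

(* An illustrative recovery (consistency) operator: it earmarks the points at
   which A does not behave inconsistently under pneg. *)
definition circ :: "'w set \<Rightarrow> 'w set" where
  "circ A \<equiv> - (A \<inter> pneg A)"

(* Classical behaviour is 'recovered' at the earmarked points: a pointwise
   gentle-explosion principle holds by construction. *)
lemma gentle_explosion_pointwise:
  "x \<in> circ A \<Longrightarrow> x \<in> A \<Longrightarrow> x \<in> pneg A \<Longrightarrow> x \<in> B"
  unfolding circ_def by auto

end
```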

    Value-Oriented Legal Argumentation in Isabelle/HOL

    Get PDF

    Logics of Formal Inconsistency enriched with replacement: an algebraic and modal account

    Get PDF
    One of the most desirable properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of its deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non self-extensional (i.e., they do not satisfy the replacement property). More than this, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics can only be characterized by semantics of a non-deterministic kind. This paper offers a solution for two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, defined by the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete w.r.t. BALFI semantics.
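    For orientation, the two notions at the centre of this abstract can be stated schematically as follows; this is a standard textbook rendering, not quoted from the paper.

```latex
% Replacement (self-extensionality): interderivable formulas may be
% substituted for one another in any context \varphi(\cdot):
\[ \mbox{if } \alpha \dashv\vdash \beta \mbox{ then } \varphi(\alpha) \dashv\vdash \varphi(\beta) \]

% Gentle explosion, characteristic of LFIs: a bare contradiction does not
% trivialize the system, but it does for formulas marked as consistent by \circ:
\[ {\circ}\alpha,\; \alpha,\; \neg\alpha \;\vdash\; \beta \]
```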

    A Computational-Hermeneutic Approach for Conceptual Explicitation

    Full text link
    We present a computer-supported approach for the logical analysis and conceptual explicitation of argumentative discourse. Computational hermeneutics harnesses recent progress in automated reasoning for higher-order logics and aims at formalizing natural-language argumentative discourse using flexible combinations of expressive non-classical logics. In doing so, it allows us to render explicit the tacit conceptualizations implicit in argumentative discursive practices. Our approach operates on networks of structured arguments and is iterative and two-layered. At one layer we search for logically correct formalizations for each of the individual arguments. At the next layer we select among those correct formalizations the ones which honor the argument's dialectic role, i.e. attacking or supporting other arguments as intended. We operate at these two layers in parallel and continuously rate sentences' formalizations by using, primarily, inferential adequacy criteria. An interpretive, logical theory will thus gradually evolve. This theory is composed of meaning postulates serving as explications for concepts playing a role in the analyzed arguments. Such a recursive, iterative approach to interpretation does justice to the inherent circularity of understanding: the whole is understood compositionally on the basis of its parts, while each part is understood only in the context of the whole (hermeneutic circle). We summarily discuss previous work on exemplary applications of human-in-the-loop computational hermeneutics in metaphysical discourse. We also discuss some of the main challenges involved in fully automating our approach. By sketching some design ideas and reviewing relevant technologies, we argue for the technological feasibility of a highly automated computational hermeneutics.
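    The second-layer check mentioned above (whether a formalization honors an argument's dialectic role) can likewise be delegated to automated tools. The following Isabelle/HOL toy is a hypothetical sketch with invented propositions: argument A counts as attacking argument B here because A's premise entails the negation of B's conclusion.

```isabelle
theory Dialectic_Role_Sketch
  imports Main
begin

(* Invented propositional atoms, standing in for analysed natural-language sentences. *)
consts Determinism :: bool
consts FreeWill    :: bool

(* Candidate formalizations of argument A's premise and of argument B's conclusion. *)
definition A_premise    where "A_premise \<equiv> Determinism \<and> (Determinism \<longrightarrow> \<not> FreeWill)"
definition B_conclusion where "B_conclusion \<equiv> FreeWill"

(* Dialectic-role check: A attacks B iff A's premise entails the negation of B's
   conclusion.  Deciding the provability (or countersatisfiability) of such lemmas
   is what the automated tools are asked to do during the interpretive loop. *)
lemma A_attacks_B: "A_premise \<longrightarrow> \<not> B_conclusion"
  unfolding A_premise_def B_conclusion_def by blast

end
```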

    Computer-assisted Reconstruction and Assessment of E. J. Lowe's Modal Ontological Argument

    Get PDF
    Computers may help us to understand --not just verify-- philosophical arguments. By utilizing modern proof assistants in an iterative interpretive process, we can reconstruct and assess an argument by fully formal means. Through the mechanization of a variant of St. Anselm's ontological argument by E. J. Lowe, which is a paradigmatic example of a natural-language argument with strong ties to metaphysics and religion, we offer an ideal showcase for our computer-assisted interpretive method.

    Types, Tableaus and Gödel's God in Isabelle/HOL

    Get PDF
    A computer-formalisation of the essential parts of Fitting's textbook "Types, Tableaus and Gödel's God" in Isabelle/HOL is presented. In particular, Fitting's (and Anderson's) variant of the ontological argument is verified and confirmed. This variant avoids the modal collapse, which has been criticised as an undesirable side-effect of Kurt Gödel's (and Dana Scott's) versions of the ontological argument. Fitting's work employs an intensional higher-order modal logic, which we shallowly embed here in classical higher-order logic. We then utilize the embedded logic for the formalisation of Fitting's argument. (See also the earlier AFP entry "Gödel's God in Isabelle/HOL".)
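    The shallow-embedding technique mentioned in the abstract can be conveyed with a deliberately simplified sketch: propositional modal operators only, with formulas lifted to predicates on possible worlds. Fitting's intensional types are considerably richer, and the names below are ours, not the AFP entry's.

```isabelle
theory Shallow_Embedding_Sketch
  imports Main
begin

typedecl i                                   (* type of possible worlds *)
type_synonym \<sigma> = "i \<Rightarrow> bool"                 (* world-lifted propositions *)

consts r :: "i \<Rightarrow> i \<Rightarrow> bool"                  (* accessibility relation *)

(* Modal connectives as operations on lifted propositions. *)
definition mimp :: "\<sigma> \<Rightarrow> \<sigma> \<Rightarrow> \<sigma>" where "mimp \<phi> \<psi> \<equiv> \<lambda>w. \<phi> w \<longrightarrow> \<psi> w"
definition mbox :: "\<sigma> \<Rightarrow> \<sigma>"      where "mbox \<phi> \<equiv> \<lambda>w. \<forall>v. r w v \<longrightarrow> \<phi> v"
definition valid :: "\<sigma> \<Rightarrow> bool"   where "valid \<phi> \<equiv> \<forall>w. \<phi> w"

(* Sanity check: the modal K schema holds in every such model. *)
lemma K: "valid (mimp (mbox (mimp \<phi> \<psi>)) (mimp (mbox \<phi>) (mbox \<psi>)))"
  unfolding valid_def mimp_def mbox_def by blast

end
```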

    Formalisation and Evaluation of Alan Gewirth's Proof for the Principle of Generic Consistency in Isabelle/HOL

    Get PDF
    An ambitious ethical theory, Alan Gewirth's "Principle of Generic Consistency", is encoded and analysed in Isabelle/HOL. Gewirth's theory has stirred much attention in philosophy and ethics and has been proposed as a potential means to bound the impact of artificial general intelligence.